python: C:/Users/AppData/Local/r-miniconda/envs/r-reticulate/python.exe
libpython: C:/Users/AppData/Local/r-miniconda/envs/r-reticulate/python36.dll
pythonhome: C:/Users/AppData/Local/r-miniconda/envs/r-reticulate
version: 3.6.13 (default, Sep 23 2021, 07:38:49) [MSC v.1916 64 bit (AMD64)]
Architecture: 64bit
numpy: C:/Users/AppData/Local/r-miniconda/envs/r-reticulate/Lib/site-packages/numpy
numpy_version: 1.19.5
NOTE: Python version was forced by use_python function
Import time series data
setwd('C:/Users/Desktop/Research/Soil_forcast')
Warning: The working directory was changed to C:/Users/Desktop/Research/Soil_forcast inside a notebook chunk. The working directory will be reset when the chunk is finished running. Use the knitr root.dir option in the setup chunk to change the working directory for notebook chunks.
###BenHur
b<-read.csv('BenHur.csv')
b<-b[6:2210,] #Start at midnight
###Chase
ch<-read.csv('Chase.csv')
ch<-ch[1:8808,] #Focus on the first year
Functional time-series methods such as ARIMA were introduced in the 20th century and are still widely used today. However, there are other ways to make short-term predictions within a system. Certain physical phenomena that are traditionally modeled deterministically can have added disturbances in the form of stochastic noise. Such noise can introduce chaos, defined as sensitivity to initial conditions, which can cause drastically different trajectories through time. Stochastic noise and chaos make it practically impossible to be certain of the long-term trajectory of any process, but it is still possible to forecast the near future.
We will apply Sparse Identification of Nonlinear Dynamics (SINDy) to a time-delay embedding and compare its forecast to that of ARIMA. We will also use the Dynamic Mode Decomposition (DMD) algorithm, which can likewise forecast a time series using a stationary transformation of it. SINDy is a type of sparse regression that identifies the important dynamics of a chaotic system; the time-delay embedding is constructed using Takens' theorem. We show that soil temperature in particular exhibits chaos, with an attractor visible in the time-delay coordinates. SINDy then produces a differential equation that recreates the system and can be used for simulation and forecasting.
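The time-delay embedding central to everything below (buildTakens in R handles it later) just stacks lagged copies of the scalar series into vectors. A minimal numpy sketch of the idea, where delay_embed is a hypothetical helper and the sine series is toy data:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Takens time-delay embedding: row i is
    (x[i], x[i+tau], ..., x[i+(dim-1)*tau])."""
    x = np.asarray(x)
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# Toy signal: a sine sampled hourly, mimicking a daily temperature cycle
t = np.arange(200)
x = np.sin(2 * np.pi * t / 24)
emb = delay_embed(x, dim=3, tau=6)
print(emb.shape)  # (188, 3)
```

In practice the lag tau and embedding dimension are chosen from the data (e.g. via average mutual information and false nearest neighbors), which is what the takens() helper below does.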
###Benhur average wind speed###
for (i in 1:9){
if (i != 8){
graphics.off()
lag.plot(b$Avg.Wind.Speed..mph.,lag=i)
}
}
soil_1<-b$Avg.Soil.Temp..F. #Average Soil Temperature in F
###Benhur average soil temp s=1
s<-1
for (j in 1:9){
if (j != 8){
graphics.off()
lag.plot(get(paste('soil_',s,sep = '')),lag=j)
}
}
Functional analysis: the ARIMA method
Look at the Fourier frequencies and possibly a Box-Cox transformation
###BenHur###
options(warn=-1) #make warnings shut up
chart.Correlation(b[,c(2:9)], histogram=F, pch=19,method = c('spearman'))
air_1<-b$Avg.Air.Temp..F. #Average Air Temperature in F
###Benhur over one week
autoplot(ts(soil_1[1:168])) + ylab('Average Soil Temperature') + xlab('Hours') + ggtitle('Benhur: 5/30/2021 12:57AM - 6/6/2021 2:57 AM')
#Plot Soil Temp and look at lagged differences
autoplot(ts(soil_1)) + ylab('Average Soil Temperature') + xlab('Hours') + ggtitle('Benhur: 5/30/2021 12:57AM - 8/29/2021 11:57PM')
ts(soil_1) %>% diff(lag=24) %>% ggtsdisplay()
#Plot Air Temp and look at lagged differences
autoplot(ts(air_1)) + ylab('Average Air Temperature') + xlab('Hours') + ggtitle('Benhur: 5/30/2021 12:57AM - 8/29/2021 11:57PM')
ts(air_1) %>% diff(lag=24) %>% ggtsdisplay()
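The diff(lag=24) pipeline above is a seasonal difference at the daily period. In numpy terms it is just x[t] - x[t-24]; a sketch on a hypothetical hourly series with a daily cycle and a slow trend:

```python
import numpy as np

hours = np.arange(24 * 10)
# Hypothetical hourly series: a daily cycle plus a slow linear trend
x = 50 + 10 * np.sin(2 * np.pi * hours / 24) + 0.01 * hours

# R's diff(lag=24) corresponds to x[t] - x[t-24]
x_diff24 = x[24:] - x[:-24]
print(x_diff24.min(), x_diff24.max())  # ≈ 0.24 for both: the cycle cancels, only the trend remains
```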
#Plot Soil Temp with Box-Cox transform and look at lagged differences
autoplot(ts(BoxCox(soil_1, lambda = 'auto'))) + ylab('Average Soil Temperature') + xlab('Hours') + ggtitle('Benhur: 5/30/2021 12:57AM - 8/29/2021 11:57PM')
ts(BoxCox(soil_1, lambda = 'auto')) %>% diff(lag=24) %>% ggtsdisplay()
#Construct ARIMA with regression
soil_f1<-auto.arima(soil_1[1657:2157],xreg = air_1[1657:2157] ,stationary = FALSE,seasonal=TRUE)
soil_f1 #model
Series: soil_1[1657:2157]
Regression with ARIMA(2,1,3) errors
Coefficients:
ar1 ar2 ma1 ma2 ma3 xreg
1.7903 -0.8703 -0.6644 -0.3579 0.0867 -0.0161
s.e. 0.0287 0.0284 0.0566 0.0499 0.0530 0.0045
sigma^2 estimated as 0.04672: log likelihood=57.98
AIC=-101.96 AICc=-101.73 BIC=-72.46
coeftest(soil_f1) #z tests of coefficient significance
z test of coefficients:
Estimate Std. Error z value Pr(>|z|)
ar1 1.7903114 0.0287489 62.2740 < 2.2e-16 ***
ar2 -0.8703486 0.0283912 -30.6556 < 2.2e-16 ***
ma1 -0.6643946 0.0565717 -11.7443 < 2.2e-16 ***
ma2 -0.3579370 0.0499325 -7.1684 7.587e-13 ***
ma3 0.0867142 0.0530341 1.6351 0.102035
xreg -0.0160732 0.0045496 -3.5329 0.000411 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
checkresiduals(soil_f1) #residuals
Ljung-Box test
data: Residuals from Regression with ARIMA(2,1,3) errors
Q* = 7.5337, df = 4, p-value = 0.1102
Model df: 6. Total lags used: 10
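auto.arima selected ARIMA(2,1,3) errors here; the MA terms and differencing are handled by the package, but the AR part of such a model can be sanity-checked by plain least squares. A minimal numpy sketch on synthetic AR(2) data (coefficients 1.5 and -0.7 are hypothetical, chosen for stationarity):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate an AR(2) process with known (hypothetical) coefficients
phi1, phi2, n = 1.5, -0.7, 2000
e = rng.normal(size=n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = phi1 * y[t - 1] + phi2 * y[t - 2] + e[t]

# Ordinary least squares: regress y[t] on y[t-1] and y[t-2]
X = np.column_stack([y[1:-1], y[:-2]])
coef, *_ = np.linalg.lstsq(X, y[2:], rcond=None)
print(np.round(coef, 2))  # close to [1.5, -0.7]
```

Note how close the pattern of signs (large positive ar1, negative ar2) is to the fitted models above; that shape produces damped oscillatory dynamics, consistent with a daily temperature cycle.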
soil_f2<-auto.arima(soil_1[1657:2157],stationary = FALSE, seasonal=TRUE)
soil_f2 #model
Series: soil_1[1657:2157]
ARIMA(2,1,3)
Coefficients:
ar1 ar2 ma1 ma2 ma3
1.7884 -0.8690 -0.7027 -0.3211 0.0862
s.e. 0.0291 0.0287 0.0557 0.0515 0.0531
sigma^2 estimated as 0.04776: log likelihood=51.99
AIC=-91.97 AICc=-91.8 BIC=-66.69
coeftest(soil_f2) #z tests of coefficient significance
z test of coefficients:
Estimate Std. Error z value Pr(>|z|)
ar1 1.788412 0.029059 61.5442 < 2.2e-16 ***
ar2 -0.869018 0.028652 -30.3296 < 2.2e-16 ***
ma1 -0.702667 0.055707 -12.6136 < 2.2e-16 ***
ma2 -0.321132 0.051460 -6.2404 4.364e-10 ***
ma3 0.086180 0.053113 1.6226 0.1047
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
checkresiduals(soil_f2) #residuals
Ljung-Box test
data: Residuals from ARIMA(2,1,3)
Q* = 7.709, df = 5, p-value = 0.173
Model df: 5. Total lags used: 10
#Forecast with model 2
soil_1[1657:2157] %>% forecast(model=soil_f2 ,h=48) %>% autoplot()
#Forecast with model 2 vs actual
soil_1_forecast <- forecast(soil_1[1657:2157],model=soil_f2, h = 48)
autoplot(soil_1_forecast) + autolayer(soil_1_forecast, series='Predicted') + autolayer(ts(soil_1[1657:2205]), series='Actual') + ylab('Average Soil Temperature') + xlab('Hours')
###Chase###
options(warn=-1) #make warnings shut up
chart.Correlation(ch[,c(2:9)], histogram=F, pch=19,method = c('spearman'))
soil_2<-ch$Avg.Soil.Temp..F. #Average Soil Temperature in F
air_2<-ch$Avg.Air.Temp..F. #Average Air Temperature in F
#Plot Soil Temp and look at lagged differences
autoplot(ts(soil_2)) + ylab('Average Soil Temperature') + xlab('Hours') + ggtitle('Chase: 1/1/2020 12:57 - 12/30/2020 11:57')
ts(soil_2) %>% diff(lag=24) %>% ggtsdisplay()
#Plot Air Temp and look at lagged differences
autoplot(ts(air_2)) + ylab('Average Air Temperature') + xlab('Hours') + ggtitle('Chase: 1/1/2020 12:57 - 12/30/2020 11:57')
ts(air_2) %>% diff(lag=24) %>% ggtsdisplay()
#Construct ARIMA with regression
soil_f1_2<-auto.arima(soil_2[8000:8760],xreg = air_2[8000:8760] ,stationary = FALSE,seasonal=TRUE)
soil_f1_2 #model
Series: soil_2[8000:8760]
Regression with ARIMA(3,1,2) errors
Coefficients:
ar1 ar2 ar3 ma1 ma2 xreg
2.2215 -1.6661 0.4007 -0.3848 -0.4817 -0.0184
s.e. 0.0549 0.1007 0.0522 0.0540 0.0414 0.0038
sigma^2 estimated as 0.05199: log likelihood=45.97
AIC=-77.95 AICc=-77.8 BIC=-45.51
coeftest(soil_f1_2) #z tests of coefficient significance
z test of coefficients:
Estimate Std. Error z value Pr(>|z|)
ar1 2.221510 0.054934 40.4395 < 2.2e-16 ***
ar2 -1.666099 0.100726 -16.5408 < 2.2e-16 ***
ar3 0.400734 0.052201 7.6768 1.631e-14 ***
ma1 -0.384822 0.053953 -7.1325 9.855e-13 ***
ma2 -0.481742 0.041414 -11.6325 < 2.2e-16 ***
xreg -0.018398 0.003815 -4.8226 1.417e-06 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
checkresiduals(soil_f1_2) #residuals
Ljung-Box test
data: Residuals from Regression with ARIMA(3,1,2) errors
Q* = 16.508, df = 4, p-value = 0.002408
Model df: 6. Total lags used: 10
soil_f2_2<-auto.arima(soil_2[8000:8760],stationary = FALSE, seasonal=TRUE)
soil_f2_2 #model
Series: soil_2[8000:8760]
ARIMA(4,0,1) with non-zero mean
Coefficients:
ar1 ar2 ar3 ar4 ma1 mean
2.1349 -1.2701 -0.1061 0.2251 0.6464 49.4413
s.e. 0.1050 0.2765 0.2605 0.0887 0.0919 0.8403
sigma^2 estimated as 0.05372: log likelihood=30.77
AIC=-47.53 AICc=-47.38 BIC=-15.09
coeftest(soil_f2_2) #z tests of coefficient significance
z test of coefficients:
Estimate Std. Error z value Pr(>|z|)
ar1 2.134895 0.104989 20.3345 < 2.2e-16 ***
ar2 -1.270115 0.276547 -4.5928 4.374e-06 ***
ar3 -0.106067 0.260475 -0.4072 0.68386
ar4 0.225093 0.088734 2.5367 0.01119 *
ma1 0.646393 0.091856 7.0370 1.964e-12 ***
intercept 49.441333 0.840312 58.8369 < 2.2e-16 ***
---
Signif. codes: 0 ‘***’ 0.001 ‘**’ 0.01 ‘*’ 0.05 ‘.’ 0.1 ‘ ’ 1
checkresiduals(soil_f2_2) #residuals
Ljung-Box test
data: Residuals from ARIMA(4,0,1) with non-zero mean
Q* = 6.8705, df = 4, p-value = 0.1429
Model df: 6. Total lags used: 10
#Forecast with model 2
soil_2[8000:8760] %>% forecast(model=soil_f2_2,h=48) %>% autoplot()
#Forecast with model 2 vs actual
soil_2_forecast <- forecast(soil_2[8000:8760],model=soil_f2_2, h = 48)
autoplot(soil_2_forecast) + autolayer(soil_2_forecast, series='Predicted') + autolayer(ts(soil_2[8000:8808]), series='Actual') + ylab('Average Soil Temperature') + xlab('Hours')
gc() #clear space for memory
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 2633341 140.7 5082215 271.5 5082215 271.5
Vcells 4413728 33.7 17090099 130.4 17090099 130.4
memory.limit(size=10000000000000)
[1] 1e+13
takens<-function(x,acf,ami,max_dim){
gc()
memory.limit(size=10000000000000)
old_par <- par(mfrow = c(1,2))
#tau_acf <- timeLag(x, technique = "acf",
#lag.max = acf, do.plot = F)
tau_ami <- timeLag(x, technique = "ami",
lag.max = ami, do.plot = F)
emb_dim <- estimateEmbeddingDim(x,time.lag = tau_ami,
max.embedding.dim = max_dim)
taken <- buildTakens(x, embedding.dim = emb_dim,
time.lag = tau_ami)
return(taken)
}
plotter1<-function(embeded,x,y,z){
col_day<-colorRampPalette(c("darkgreen","red","green"))
col_day<-col_day(24)
col_year<-colorRampPalette(c("red","blue"))
col_year<-col_year(length(embeded[,1]))
plt<-scatter3d(x = embeded[,x],
y = embeded[,y],
z = embeded[,z],
point.col = col_day[seq(1:24)],
surface = F)
aspect3d(1,1,1)
}
plotter2<-function(embeded,x,y,z){
col_day<-colorRampPalette(c("darkgreen","red","green"))
col_day<-col_day(24)
col_year<-colorRampPalette(c("red","blue"))
col_year<-col_year(length(embeded[,1]))
scatter3d(x = embeded[,x],
y = embeded[,y],
z = embeded[,z],
point.col = col_year[seq(1:length(embeded[,1]))],
surface = F)
aspect3d(1,1,1)
}
Takens' theorem for time-delay embedding
tak_b<-takens(soil_1,100,100,15)
plotter1(tak_b,1,2,3)
Loading required namespace: mgcv
rglwidget()
plotter2(tak_b,1,2,3)
rglwidget()
plotter1(tak_b,4,5,6)
rglwidget()
plotter2(tak_b,4,5,6)
rglwidget()
plotter1(tak_b,1,3,5)
rglwidget()
plotter2(tak_b,1,3,5)
rglwidget()
plotter1(tak_b,2,4,6)
rglwidget()
plotter2(tak_b,2,4,6)
rglwidget()
###Chase average soil temp
gc()
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 2713001 144.9 5082215 271.5 5082215 271.5
Vcells 4614906 35.3 17090099 130.4 17090099 130.4
memory.limit(size=10000000000000)
[1] 1e+13
tak_c<-takens(soil_2,100,100,15)
plotter1(tak_c,1,2,3)
rglwidget()
plotter2(tak_c,1,2,3)
rglwidget()
gc()
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 2713979 145.0 5082216 271.5 5082216 271.5
Vcells 4759427 36.4 17090099 130.4 17090099 130.4
memory.limit(size=10000000000000)
[1] 1e+13
The dynamics are present within the embedding. For BenHur, every other embedding coordinate (1,3,5 or 2,4,6) shows a trapezoidal attractor. The Chase data, which cover an entire year, exhibit the same attractor in the first three embedding coordinates. Notice that when the first difference is taken, the topological structure changes, but the attractor still appears to be preserved.
#remove non-stationarity
#lag-1 difference
soil_1_diff<-diff(soil_1,lag=1)
plot(soil_1_diff,type = 'l')
gc()
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 2712231 144.9 5082216 271.5 5082216 271.5
Vcells 4668699 35.7 17090099 130.4 17090099 130.4
memory.limit(size=10000000000000)
[1] 1e+13
tak_b_2<-takens(soil_1_diff,100,100,15)
plotter1(tak_b_2,1,3,5) #the embedding collapses to a blob, but it is still clustered
#there is still an attractor at the center
#remove non-stationarity
#lag-1 difference
soil_2_diff<-diff(soil_2,lag=1)
plot(soil_2_diff,type = 'l')
gc()
used (Mb) gc trigger (Mb) max used (Mb)
Ncells 2713144 144.9 5082216 271.5 5082216 271.5
Vcells 4731153 36.1 17090099 130.4 17090099 130.4
memory.limit(size=10000000000000)
[1] 1e+13
tak_c_2<-takens(soil_2_diff,100,100,15)
plotter1(tak_c_2,1,2,3) #the embedding collapses to a blob, but it is still clustered
Acf(soil_1)
Acf(soil_1_diff)
Acf(soil_2)
Acf(soil_2_diff)
#very much indicates this is not a random walk
Box.test(soil_1_diff)
Box-Pierce test
data: soil_1_diff
X-squared = 1678.7, df = 1, p-value < 2.2e-16
#hourly changes in temperature are definitely not a random walk; they are correlated with their previous values
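Box.test above defaults to the Box-Pierce statistic: n times the sum of squared sample autocorrelations, compared to a chi-square distribution. A minimal numpy sketch (box_pierce is a hypothetical helper, and the two test series are synthetic):

```python
import numpy as np

def box_pierce(x, lag=1):
    """Box-Pierce Q statistic: n * sum of squared sample autocorrelations
    up to `lag`; under the white-noise null, Q ~ chi-square(lag)."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n, denom = len(x), np.sum(x * x)
    q = sum((np.sum(x[k:] * x[:-k]) / denom) ** 2 for k in range(1, lag + 1))
    return n * q

rng = np.random.default_rng(1)
white = rng.normal(size=500)   # white noise: Q should be small
walk = np.cumsum(white)        # random-walk levels: strongly autocorrelated
print(box_pierce(white, lag=1))
print(box_pierce(walk, lag=1) > 3.84)  # exceeds the chi-square(1) 5% cutoff: True
```

The soil-difference series behaves like the second case: its Q statistic (1678.7 above) is far beyond the critical value, so the differences are clearly serially dependent.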
rm(tak_b_2,tak_c_2)
reticulate::repl_python()
Python 3.6.13 (C:/Users/AppData/Local/r-miniconda/envs/r-reticulate/python.exe)
Reticulate 1.20 REPL -- A Python interpreter in R.
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D
from matplotlib.cm import rainbow
import numpy as np
from scipy.integrate import odeint
from scipy.io import loadmat
import pysindy as ps
import plotly.graph_objects as go
ModuleNotFoundError: No module named 'plotly'
from sklearn import linear_model #for lasso
dt = 1 # notice that this is 1 hour
#if smaller make sure to notice the coefficients will be on this scale
train=np.delete(r.tak_c, range(3, 8, 1), axis=1) #Use Chase as training set
#Use every other column for Benhur as the testing set
test=np.delete(r.tak_b, range(1, 9, 2), axis=1)
test=np.delete(test, range(3, 5, 1), axis=1)
#r.tak_b[1]
#test[1]
#if you try it with the differenced data, the lasso coefficients are all zero
np.random.seed(1)
lasso_optimizer =linear_model.Lasso(alpha=1, max_iter=2000,tol =0.5, fit_intercept=False)
model = ps.SINDy(optimizer=lasso_optimizer)
model.fit(train, t=dt)
SINDy(differentiation_method=FiniteDifference(),
feature_library=PolynomialLibrary(), feature_names=['x0', 'x1', 'x2'],
optimizer=Lasso(alpha=1, fit_intercept=False, max_iter=2000, tol=0.5))
model.print()
x0' = -0.126 x0 + 0.199 x1 + -0.073 x2 + -0.012 x0^2 + 0.024 x0 x1 + 0.002 x0 x2 + -0.024 x1 x2 + 0.010 x2^2
x1' = -0.099 x0 + -0.035 x1 + 0.132 x2 + 0.007 x0^2 + 0.006 x0 x1 + -0.020 x0 x2 + -0.014 x1^2 + 0.023 x1 x2 + -0.002 x2^2
x2' = -0.087 x1 + 0.090 x2 + 0.001 x0^2 + -0.025 x0 x1 + 0.025 x0 x2 + 0.014 x1^2 + -0.004 x1 x2 + -0.011 x2^2
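The printed equations are polynomial right-hand sides of an ODE in the delay coordinates. As a rough check, the x0' line can be evaluated directly and advanced one forward-Euler step (coefficients copied from the output above; the state is the one later passed to model.simulate; SINDy's own simulate uses a proper ODE integrator, so this is only a sketch of the mechanics):

```python
# Coefficients copied from the x0' equation printed above
def x0_dot(x0, x1, x2):
    return (-0.126 * x0 + 0.199 * x1 - 0.073 * x2
            - 0.012 * x0**2 + 0.024 * x0 * x1 + 0.002 * x0 * x2
            - 0.024 * x1 * x2 + 0.010 * x2**2)

# One forward-Euler step with dt = 1 hour: x0(t + dt) ≈ x0 + dt * x0'
dt = 1.0
x0, x1, x2 = 86.4, 85.0, 83.5   # a state on the attractor (degrees F)
x0_next = x0 + dt * x0_dot(x0, x1, x2)
print(round(x0_next, 2))  # → 86.82
```

The individual coefficients are tiny, but at temperatures near 85 °F the quadratic terms largely cancel, leaving a derivative of a fraction of a degree per hour, which is physically plausible for soil.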
model.score(train)
0.7524666374467947
model.score(test)
0.473510746561548
x0=model.equations()[0]
x0
'-0.126 x0 + 0.199 x1 + -0.073 x2 + -0.012 x0^2 + 0.024 x0 x1 + 0.002 x0 x2 + -0.024 x1 x2 + 0.010 x2^2'
#model.predict returns the estimated derivatives
pred=model.predict(test)
pred=np.delete(pred,range(1,3,1),1) #keep only the x0 derivative
quit
#py$pred contains the derivatives from the time-delay embedding
expected<-soil_1[1:2172]+py$pred[1:2172] #forward Euler step: x0 + dt*x0'
#soil_1[2:2173] is the true next value
plot(expected,soil_1[2:2173]) #predicted next value vs actual
autoplot(ts(expected)) + autolayer(ts(expected), series='Predicted') + autolayer(ts(soil_1[2:2173]), series='Actual') + ylab('Average Soil Temperature') + xlab('Hours')
These are fitted values, in a sense. But note that fitted values are not true forecasts.
res<-soil_1[2:2173]-expected
hist(res)
plot(res,type='l')
Acf(res)
Box.test(res,lag=1, fitdf=0, type="Lj")
Box-Ljung test
data: res
X-squared = 1301.7, df = 1, p-value < 2.2e-16
#residuals are not coming from white noise
So the assumed error distribution, error ~ N(0,1), does not hold here.
reticulate::repl_python()
Python 3.6.13 (C:/Users/AppData/Local/r-miniconda/envs/r-reticulate/python.exe)
Reticulate 1.20 REPL -- A Python interpreter in R.
t = np.arange(0,48, dt)
x0=np.array([86.4,85.0,83.5])
#x0=test
simulation=model.simulate(x0,t)
quit
plot(py$simulation)
plot(py$simulation[,2]) #only giving it 3 coords from time delay
plotter1(py$simulation,1,2,3)
rglwidget()
autoplot(soil_1_forecast) + autolayer(ts(c(soil_1[1657:2157],py$simulation[,2])), series='Predicted') + autolayer(ts(soil_1[1657:2205]), series='Actual') + ylab('Average Soil Temperature') + xlab('Hours')
library(reticulate)
reticulate::py_config()
reticulate::repl_python()
Python 3.6.13 (C:/Users/AppData/Local/r-miniconda/envs/r-reticulate/python.exe)
Reticulate 1.20 REPL -- A Python interpreter in R.
import pandas as pd
import matplotlib.pyplot as plt
import numpy as np
import time
from pydmd import HODMD
#data = r.soil_1[1500:]
data = r.soil_1_diff[1500:]
### you need to remove outliers
###notice the anomaly between 100 and 250 (when using r.soil_1[0:]): it throws off predictions, which fit best to the first set of observations
end_time = 400
y = data[0:end_time]
hodmd = HODMD(svd_rank=0, tlsq_rank=0, exact=True, opt=True, d=200).fit(y)
C:\Users\AppData\Local\R-MINI~1\envs\R-RETI~1\lib\site-packages\pydmd\hodmd.py:141: UserWarning: The parameter 'svd_rank_extra=0' has been ignored because
the given system is a scalar function
self.svd_rank_extra
pd.DataFrame(hodmd.atilde) #A tilda
0 1 2 ... 15 16 17
0 0.966393 0.262622 0.004219 ... 0.002201 0.005561 0.004361
1 -0.256206 0.964512 0.005899 ... 0.004091 0.014343 0.013855
2 0.002223 -0.003186 0.867887 ... 0.005789 0.007589 0.012364
3 -0.000529 0.004100 -0.499650 ... -0.000365 0.001571 -0.004773
4 0.000195 0.000290 0.001891 ... -0.005073 -0.013737 -0.009778
5 -0.001128 0.002494 -0.003615 ... -0.005880 0.003538 -0.011843
6 -0.000088 -0.004539 0.000484 ... -0.006324 0.003898 -0.012559
7 0.000981 -0.000279 0.002172 ... -0.002866 -0.027815 -0.013066
8 -0.000417 -0.001933 -0.000461 ... -0.011686 0.016602 -0.015795
9 -0.000896 0.000753 -0.003099 ... 0.011951 0.067790 0.027200
10 0.000366 -0.001195 0.001314 ... 0.005388 -0.008134 0.010126
11 0.000448 0.001195 -0.000875 ... 0.007813 -0.019185 -0.011364
12 0.000252 -0.000269 0.001287 ... -0.038669 -0.090441 -0.042728
13 0.000659 -0.000528 0.001640 ... 0.076242 -0.039534 0.069939
14 -0.000398 -0.000225 -0.000443 ... -0.639289 -0.014080 -0.116019
15 -0.000296 0.000564 -0.001479 ... 0.750925 0.141393 -0.018804
16 0.000722 -0.001909 0.001870 ... -0.136376 0.870767 0.399528
17 -0.000551 0.001794 -0.002964 ... -0.017645 -0.388700 0.897689
[18 rows x 18 columns]
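Conceptually, HODMD with d=200 is standard DMD applied to a Hankel (delay-stacked) matrix of the scalar series: build snapshot pairs from the delay vectors, then compute the reduced operator A-tilde via an SVD. A minimal numpy sketch of that idea on a pure daily oscillation (hodmd_atilde is a hypothetical helper; pydmd additionally handles rank selection, modes, and reconstruction):

```python
import numpy as np

def hodmd_atilde(x, d, rank):
    """Sketch of the higher-order DMD idea: stack d delayed copies of a
    scalar series into snapshot columns, then compute the reduced
    operator A_tilde from an SVD of the snapshot pairs."""
    n = len(x) - d + 1
    H = np.column_stack([x[i : i + d] for i in range(n)])  # d x n Hankel snapshots
    X, Y = H[:, :-1], H[:, 1:]                             # one-step-shifted pairs
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :rank], s[:rank], Vh[:rank]
    return U.conj().T @ Y @ Vh.conj().T @ np.diag(1.0 / s)

t = np.arange(300)
x = np.sin(2 * np.pi * t / 24)            # a single daily frequency
A = hodmd_atilde(x, d=30, rank=2)
eigs = np.linalg.eigvals(A)
print(np.round(np.abs(eigs), 3))          # both magnitudes ≈ 1: a pure oscillation
```

Eigenvalues of A-tilde on the unit circle correspond to sustained oscillations, which is why hodmd.plot_eigs() below is a useful diagnostic: eigenvalues inside the circle decay, and eigenvalues outside it grow during extrapolation.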
#pd.DataFrame(hodmd.dynamics)
#pd.DataFrame(hodmd.modes)
plt.clf()
hodmd.plot_eigs()
#m = 24
m=168
hodmd.dmd_time['tend'] = end_time + m
#plt.plot(hodmd.original_timesteps, y, '.', label='y')
plt.clf()
plt.plot(hodmd.original_timesteps, y, '-', label='Training Set')
[<matplotlib.lines.Line2D object at 0x00000138BACDC080>]
plt.plot(np.array([k+end_time for k in range(m)]), data[end_time:end_time+m], '-', label='Real Outcome')
[<matplotlib.lines.Line2D object at 0x00000138BACDC4A8>]
plt.plot(hodmd.dmd_timesteps, hodmd.reconstructed_data[0].real, '--', label='DMD Output')
[<matplotlib.lines.Line2D object at 0x00000138BACDC748>]
plt.legend()
<matplotlib.legend.Legend object at 0x00000138BACDCB70>
plt.show()